Compressed Sensing over ℓp-balls: Minimax mean square error

Authors

  • David L. Donoho
  • Iain Johnstone
  • Arian Maleki
  • Andrea Montanari
Abstract

We consider the compressed sensing problem, where the object x0 ∈ R^N is to be recovered from incomplete measurements y = Ax0 + z; here the sensing matrix A is an n × N random matrix with iid Gaussian entries and n < N. A popular method of sparsity-promoting reconstruction is ℓ1-penalized least-squares reconstruction (aka LASSO, Basis Pursuit). It is currently popular to consider the strict sparsity model, where the object x0 is nonzero in only a small fraction of entries. In this paper, we instead consider the much more broadly applicable ℓp-sparsity model, where x0 is sparse in the sense of having ℓp norm bounded by ξ · N for some fixed 0 < p ≤ 1 and ξ > 0. We study an asymptotic regime in which n and N both tend to infinity with limiting ratio n/N = δ ∈ (0, 1), both in the noisy (z ≠ 0) and noiseless (z = 0) cases. Under weak assumptions on x0, we are able to precisely evaluate the worst-case asymptotic minimax mean-squared reconstruction error (AMSE) for ℓ1-penalized least-squares: min over penalization parameters, max over ℓp-sparse objects x0. We exhibit the asymptotically least-favorable object (hardest sparse signal to recover) and the maximin penalization. In the case where n/N tends to zero slowly (i.e. extreme undersampling), our formulas (normalized for comparison) say that the minimax AMSE of ℓ1-penalized least-squares is asymptotic to ξ · (2 log(N/n)/n)^{2/p−1} · (1 + o(1)). Thus we have not only the rate but also the constant factor on the AMSE, and the maximin penalty factor needed to attain this performance is also precisely specified. Other similarly precise calculations are showcased. Our explicit formulas unexpectedly involve quantities appearing classically in statistical decision theory. Occurring in the present setting, they reflect a deeper connection between penalized ℓ1 minimization and scalar soft thresholding.
This connection, which follows from earlier work of the authors and collaborators on the AMP iterative thresholding algorithm, is carefully explained. Our approach also gives precise results under weak-`p ball coefficient constraints, as we show here.
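The connection between ℓ1-penalized least squares and scalar soft thresholding can be illustrated with a minimal iterative soft-thresholding (ISTA) sketch. This is the classical proximal-gradient baseline, not the paper's AMP algorithm; the dimensions, noise level, seed, and penalty λ below are illustrative choices, not values from the paper.

```python
import numpy as np

def soft_threshold(x, t):
    # Scalar soft thresholding: eta(x; t) = sign(x) * max(|x| - t, 0)
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam, n_iter=500):
    # Iterative soft-thresholding for min_x 0.5*||y - A x||^2 + lam*||x||_1:
    # gradient step on the quadratic term, then coordinate-wise soft threshold.
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x + A.T @ (y - A @ x) / L, lam / L)
    return x

rng = np.random.default_rng(0)
N, n, k = 200, 100, 5                      # ambient dim, measurements, sparsity
A = rng.normal(size=(n, N)) / np.sqrt(n)   # iid Gaussian sensing matrix
x0 = np.zeros(N)
x0[:k] = 1.0                               # a k-sparse object
y = A @ x0 + 0.01 * rng.normal(size=n)     # noisy undersampled measurements
x_hat = ista(A, y, lam=0.05)
print(np.mean((x_hat - x0) ** 2))          # per-coordinate reconstruction MSE
```

Each ISTA iteration applies exactly the scalar soft-thresholding nonlinearity that appears in the paper's formulas; AMP refines this recursion with an Onsager correction term and a rescaled effective threshold.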


Similar resources

Compressed Sensing over $\ell_p$-balls: Minimax Mean Square Error

We consider the compressed sensing problem, where the object x0 ∈ R^N is to be recovered from incomplete measurements y = Ax0 + z; here the sensing matrix A is an n × N random matrix with iid Gaussian entries and n < N. A popular method of sparsity-promoting reconstruction is ℓ1-penalized least-squares reconstruction (aka LASSO, Basis Pursuit). It is currently popular to consider the strict spars...

Full text

The Gelfand widths of ℓp-balls for 0<p≤1

We provide sharp lower and upper bounds for the Gelfand widths of ℓp-balls in the N-dimensional ℓ_q^N-space for 0 < p ≤ 1 and p < q ≤ 2. Such estimates are highly relevant to the novel theory of compressive sensing, and our proofs rely on methods from this area.

Full text

Neighborhoods as Nuisance Parameters? Robustness vs. Semiparametrics

Deviations from the center within a robust neighborhood may naturally be considered an infinite dimensional nuisance parameter. Thus, the semiparametric method may be tried, which is to compute the scores function for the main parameter minus its orthogonal projection on the closed linear tangent space for the nuisance parameter, and then rescale for Fisher consistency. We derive such a semipar...

Full text

Neighborhoods as Nuisance Parameters

Deviations from the center within a robust neighborhood may naturally be considered an infinite dimensional nuisance parameter. Thus, in principle, the semiparametric method may be tried, which is to compute the scores function for the main parameter minus its orthogonal projection on the closed linear tangent space for the nuisance parameter, and then rescale for Fisher consistency. We derive s...

Full text

Minimax Bayes, asymptotic minimax and sparse wavelet priors

Pinsker (1980) gave a precise asymptotic evaluation of the minimax mean squared error of estimation of a signal in Gaussian noise when the signal is known a priori to lie in a compact ellipsoid in Hilbert space. This 'Minimax Bayes' method can be applied to a variety of global non-parametric estimation settings with parameter spaces far from ellipsoidal. For example it leads to a theory of exact...

Full text




Publication date: 2011